Reviews: Low-Rank Regression with Tensor Responses

Neural Information Processing Systems

Strength: The paper provides a theoretical analysis of approximation guarantees and a generalization bound for the class of tensor-valued regression functions. Weakness: A major drawback is that the novelty and contribution are rather limited. The key idea and model of this paper are essentially equivalent to HOPLS in the following paper: [Zhao et al.]. HOPLS assumes that both the tensor input and the tensor output have low-rank structure, establishes the link between them in a common latent space, and then performs a regression step against the projected latent variables.
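The latent-space idea the reviewer describes can be sketched as follows. This is a simplified matrix analogue written for illustration, not the full tensor HOPLS algorithm: both the (matricized) input and output are projected onto low-dimensional latent subspaces obtained from their leading singular vectors, and the regression is carried out between the projected scores. The function name and the choice of SVD-based projections are assumptions for this sketch.

```python
import numpy as np

def latent_space_regression(X, Y, k):
    """Simplified sketch of the HOPLS-style idea: project input and
    output onto k-dimensional latent subspaces, then regress the
    output scores on the input scores. Not the full tensor HOPLS."""
    n = X.shape[0]
    out_shape = Y.shape[1:]
    Xm = X.reshape(n, -1)                    # flatten input per sample
    Ym = Y.reshape(n, -1)                    # flatten output per sample
    # Leading right singular vectors stand in for the low-rank structure
    Px = np.linalg.svd(Xm, full_matrices=False)[2][:k].T  # (d_x, k)
    Py = np.linalg.svd(Ym, full_matrices=False)[2][:k].T  # (d_y, k)
    Tx, Ty = Xm @ Px, Ym @ Py                # latent scores
    # Regression step in the common latent space
    B = np.linalg.lstsq(Tx, Ty, rcond=None)[0]

    def predict(Xnew):
        scores = Xnew.reshape(Xnew.shape[0], -1) @ Px @ B
        return (scores @ Py.T).reshape((Xnew.shape[0],) + out_shape)

    return predict
```

When the data truly share a low-dimensional latent factor, the projected regression recovers the output almost exactly; with full-rank data the truncation at k acts as a regularizer.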



Low-Rank Regression with Tensor Responses

Rabusseau, Guillaume, Kadri, Hachem

Neural Information Processing Systems

This paper proposes an efficient algorithm (HOLRR) to handle regression tasks where the outputs have a tensor structure. We formulate the regression problem as the minimization of a least-squares criterion under a multilinear rank constraint, a difficult non-convex problem. HOLRR efficiently computes an approximate solution to this problem, with solid theoretical guarantees. A kernel extension is also presented. Experiments on synthetic and real data show that HOLRR computes accurate solutions while being computationally very competitive.
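The setting in the abstract can be illustrated with a minimal sketch: fit an unconstrained least-squares coefficient tensor, then project it onto a low multilinear rank by truncating each mode-unfolding's SVD (a truncated-HOSVD projection). This is an illustrative approximation of the problem the paper studies, not the authors' exact HOLRR procedure; the function name and the OLS-then-project strategy are assumptions of this sketch.

```python
import numpy as np

def low_rank_tensor_regression(X, Y, ranks):
    """Sketch: minimize ||Y - X.W||^2 without constraints, then project
    the coefficient tensor W onto low multilinear rank via truncated
    HOSVD. Not the authors' exact HOLRR algorithm."""
    n, d = X.shape
    # Unconstrained least-squares fit on the matricized response
    W_ols = np.linalg.lstsq(X, Y.reshape(n, -1), rcond=None)[0]
    W = W_ols.reshape((d,) + Y.shape[1:])    # coefficient tensor (d, p1, p2, ...)
    # Project each mode onto its leading r-dimensional singular subspace
    for mode, r in enumerate(ranks):
        M = np.moveaxis(W, mode, 0).reshape(W.shape[mode], -1)  # mode-unfolding
        U = np.linalg.svd(M, full_matrices=False)[0][:, :r]
        W = np.moveaxis(
            np.tensordot(U @ U.T, np.moveaxis(W, mode, 0), axes=1), 0, mode
        )
    return W

def predict(X, W):
    """Contract samples against the coefficient tensor: Y_hat = X.W."""
    return np.tensordot(X, W, axes=1)
```

After the projection loop, each mode-unfolding of W has rank at most the corresponding entry of `ranks`, which is the multilinear rank constraint the abstract refers to.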